Growth Function and VC-dimension

Authors

  • Ofer Dekel
  • Xu Miao
Abstract

1 Review of VC theory

Our primary interest so far has been deriving generalization bounds for binary classifiers. We have studied Rademacher complexity techniques and shown that the VC bound is an upper bound on the Rademacher complexity bound. For binary classification with the 0-1 loss $\ell$, we have

$$\mathcal{R}_m(\ell \circ H) \le \sqrt{\frac{2 \log g_H(m)}{m}} \tag{1}$$

where $H = \{h : X \to \{+1, -1\}\}$ is a hypothesis space. The growth function $g_H(m)$ is defined as the maximum number of distinct ways the hypothesis space $H$ can label an arbitrary sample of $m$ points (Definition 1).
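To make the definition concrete, here is a minimal Python sketch (our own illustration, not part of the original notes) that counts the distinct labelings the class of one-dimensional threshold classifiers $h_t(x) = \mathrm{sign}(x - t)$ induces on a sample, confirming that its growth function is $g_H(m) = m + 1$, far below the $2^m$ labelings of an unrestricted class:

```python
import math

def threshold_labelings(points):
    """Return the set of distinct labelings that threshold classifiers
    h_t(x) = +1 if x >= t else -1 induce on the given sample points."""
    xs = sorted(points)
    # It suffices to test one threshold below all points, one between
    # each consecutive pair of points, and one above all points.
    thresholds = [xs[0] - 1.0]
    thresholds += [(a + b) / 2 for a, b in zip(xs, xs[1:])]
    thresholds += [xs[-1] + 1.0]
    return {tuple(+1 if x >= t else -1 for x in points) for t in thresholds}

m = 5
sample = [0.3, 1.7, 2.2, 4.9, 6.1]      # any m distinct reals work
g = len(threshold_labelings(sample))
print(g)                                 # 6 == m + 1, versus 2**m == 32
print(math.sqrt(2 * math.log(g) / m))    # the right-hand side of (1)
```

Because $g_H(m)$ grows only linearly here, the right-hand side of (1) vanishes as $m \to \infty$, which is the qualitative content of the VC bound.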


Similar resources

Margin Analysis

In the last few lectures we have seen how to obtain high confidence bounds on the generalization error of functions learned from function classes of limited capacity, measured in terms of the growth function and VC-dimension for binary-valued function classes in the case of binary classification, and in terms of the covering numbers, pseudo-dimension, and fat-shattering dimension for real-value...


Covering Numbers, Pseudo-Dimension, and Fat-Shattering Dimension

So far we have seen how to obtain high confidence bounds on the generalization error $\mathrm{er}_D[h_S]$ of a binary classifier $h_S$ learned by an algorithm from a function class $H \subseteq \{-1, 1\}^X$ of limited capacity, using the ideas of uniform convergence. We saw the use of the growth function $\Pi_H(m)$ to measure the capacity of the class $H$, as well as the VC-dimension $\mathrm{VCdim}(H)$, which provides a one-number summa...


Error Bounds for Real Function Classes Based on Discretized Vapnik-Chervonenkis Dimensions

The Vapnik-Chervonenkis (VC) dimension plays an important role in statistical learning theory. In this paper, we propose the discretized VC dimension obtained by discretizing the range of a real function class. Then, we point out that Sauer's Lemma is valid for the discretized VC dimension. We group the real function classes having infinite VC dimension into four categories by using the dis...
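Since the abstract above leans on Sauer's Lemma, here is a minimal sketch (our illustration, using the classical lemma rather than the paper's discretized variant) of the bound it gives on the growth function, $g_H(m) \le \sum_{i=0}^{d} \binom{m}{i}$ for a class of VC dimension $d$:

```python
from math import comb

def sauer_bound(m: int, d: int) -> int:
    """Classical Sauer's Lemma: an upper bound on the growth function
    g_H(m) of any hypothesis class with VC dimension d."""
    return sum(comb(m, i) for i in range(d + 1))

# For m <= d the bound equals 2**m (no restriction on labelings);
# beyond that it grows only polynomially in m, which is what makes
# VC-style generalization bounds non-trivial.
for m in (3, 10, 100):
    print(m, sauer_bound(m, d=3), 2 ** m)
```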


Algorithmic Stability and Regularization Algorithms in an RKHS

In the last few lectures we have seen a number of different generalization error bounds for learning algorithms, using notions such as the growth function and VC dimension; covering numbers, pseudo-dimension, and fat-shattering dimension; margins; and Rademacher averages. While these bounds are different in nature and apply in different contexts, a unifying factor that they all share is that tha...


VC Dimension Bounds for Higher-Order Neurons

We investigate the sample complexity for learning using higher-order neurons. We calculate upper and lower bounds on the Vapnik-Chervonenkis dimension and the pseudo-dimension for higher-order neurons that allow unrestricted interactions among the input variables. In particular, we show that the degree of interaction is irrelevant for the VC dimension and that the individual degree of the varia...




Publication date: 2011